Joint Probability

A statistical measure that calculates the likelihood of two events occurring together at the same point in time. Joint probability is the probability of event Y occurring at the same time that event X occurs.

Notation for joint probability takes the form:

P(X ∩ Y)
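where ∩ denotes the intersection of the two events. When X and Y are independent, the joint probability reduces to the product of the individual probabilities:

P(X ∩ Y) = P(X) × P(Y)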

Joint probability is a measure of two events happening at the same time, and it can only be applied to situations in which more than one outcome can occur at the same time.

For example, a joint probability cannot be calculated for obtaining both heads and tails on a single coin flip, because the two outcomes are mutually exclusive. However, a joint probability can be calculated for rolling a 2 on one die and a 5 on another when two dice are thrown: each outcome has probability 1/6, so the joint probability is (1/6) × (1/6) = 1/36.
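A minimal Python sketch, assuming two fair six-sided dice (the function name and trial count are illustrative, not part of the original entry), estimates this joint probability by simulation:

    import random

    def estimate_joint_probability(trials=100_000):
        # Count trials in which die 1 shows a 2 and die 2 shows a 5.
        hits = 0
        for _ in range(trials):
            die1 = random.randint(1, 6)  # fair six-sided die
            die2 = random.randint(1, 6)
            if die1 == 2 and die2 == 5:
                hits += 1
        return hits / trials

    # With a large trial count, the estimate should be close to 1/36 ≈ 0.0278.
    print(estimate_joint_probability())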


Investment dictionary. 2012.


Look at other dictionaries:

  • Joint probability density function — may refer to: Probability density function; Joint probability distribution …   Wikipedia

  • Joint probability distribution — In the study of probability, given two random variables X and Y that are defined on the same probability space, the joint distribution for X and Y defines the probability of events defined in terms of both X and Y. In the case of only two random… …   Wikipedia

  • joint probability — noun the probability of two events occurring together • Hypernyms: ↑probability, ↑chance …   Useful english dictionary

  • Probability — is the likelihood or chance that something is the case or will happen. Probability theory is used extensively in areas such as statistics, mathematics, science and philosophy to draw conclusions about the likelihood of potential events and the… …   Wikipedia

  • Probability density function — Boxplot and probability density function of a normal distribution N(0, σ²). In probability theory, a probability density function (pdf), or density of a continuous random variable, is a function that describes the relative likelihood for this… …   Wikipedia

  • Joint entropy — The joint entropy is an entropy measure used in information theory. The joint entropy measures how much entropy is contained in a joint system of two random variables. If the random variables are X and Y, the joint entropy is written H(X,Y). Like …   Wikipedia

  • Probability metric — A probability metric is a function defining a distance between random variables or vectors. In particular the probability metric does not satisfy the identity of indiscernibles condition required to be satisfied by the metric of the metric… …   Wikipedia

  • probability theory — Math., Statistics. the theory of analyzing and making statements concerning the probability of the occurrence of uncertain events. Cf. probability (def. 4). [1830–40] * * * Branch of mathematics that deals with analysis of random events.… …   Universalium

  • Joint quantum entropy — The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states ρ and σ, represented as density operators that are subparts of a quantum system, the joint… …   Wikipedia

  • Probability distribution — This article is about probability distribution. For generalized functions in mathematical analysis, see Distribution (mathematics). For other uses, see Distribution (disambiguation). In probability theory, a probability mass, probability density …   Wikipedia
